Kullback-Leibler Boosting

Authors

  • Ce Liu
  • Harry Shum
Abstract

In this paper, we develop a general classification framework called Kullback-Leibler Boosting, or KLBoosting. KLBoosting has the following properties. First, classification is based on the sum of histogram divergences along corresponding global and discriminating linear features. Second, these linear features, called KL features, are iteratively learnt by maximizing the projected Kullback-Leibler divergence in a boosting manner. Third, the coefficients used to combine the histogram divergences are learnt by minimizing the recognition error once a new feature is added to the classifier. This contrasts with conventional AdaBoost, where the coefficients are set empirically. Because of these properties, the KLBoosting classifier generalizes very well. Moreover, to apply KLBoosting to high-dimensional image spaces, we propose a data-driven Kullback-Leibler Analysis (KLA) approach to find KL features for image objects (e.g., face patches). Promising experimental results on face detection demonstrate the effectiveness of KLBoosting.
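As a rough illustration of the feature-selection step described above, here is a minimal sketch, not the authors' implementation: it assumes candidate linear features are supplied as rows of a matrix, uses a fixed 32-bin histogram over a shared range, and scores each candidate by a symmetrized projected KL divergence. The function names and the candidate-pool setup are illustrative assumptions.

import numpy as np

def projected_histograms(w, X_pos, X_neg, bins=32, eps=1e-8):
    # 1-D histograms of the two classes projected onto feature w,
    # computed over a shared range and smoothed so no bin is zero.
    u, v = X_pos @ w, X_neg @ w
    lo, hi = min(u.min(), v.min()), max(u.max(), v.max())
    p, _ = np.histogram(u, bins=bins, range=(lo, hi))
    q, _ = np.histogram(v, bins=bins, range=(lo, hi))
    p = p / p.sum() + eps
    q = q / q.sum() + eps
    return p / p.sum(), q / q.sum()

def kl(p, q):
    # Kullback-Leibler divergence between two discrete histograms.
    return float(np.sum(p * np.log(p / q)))

def select_kl_feature(X_pos, X_neg, candidates):
    # Greedy boosting step: keep the direction whose projected class
    # histograms are most divergent. The symmetrized score is an
    # assumption; the paper maximizes the projected KL divergence.
    best_w, best_score = None, -np.inf
    for w in candidates:
        p, q = projected_histograms(w, X_pos, X_neg)
        score = kl(p, q) + kl(q, p)
        if score > best_score:
            best_w, best_score = w, score
    return best_w

One common way to realize the "sum of histogram divergences" decision rule is to threshold F(x) = Σ_k α_k log(p_k(w_kᵀx) / q_k(w_kᵀx)) at zero, with p_k and q_k the learnt class histograms along feature w_k; per the third property above, the α_k are optimized to minimize recognition error each time a feature is added, rather than set by a closed-form AdaBoost weight.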


Similar resources

Comparison of Kullback-Leibler, Hellinger and LINEX with Quadratic Loss Function in Bayesian Dynamic Linear Models: Forecasting of Real Price of Oil

In this paper we intend to examine the application of the Kullback-Leibler, Hellinger and LINEX loss functions in the Dynamic Linear Model, using the real price of oil over 106 years of data from 1913 to 2018, concerning the asymmetric problem in filtering and forecasting. We use the DLM form of the basic Hotelling Model under the Quadratic, Kullback-Leibler, Hellinger and LINEX loss functions, trying to address the ...
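For reference, the losses and divergences named in this abstract have standard textbook forms (general definitions, not expressions taken from the paper itself):

\mathrm{KL}(f \,\|\, g) = \int f(x)\,\log\frac{f(x)}{g(x)}\,dx,
\qquad
H^2(f,g) = \frac{1}{2}\int\Bigl(\sqrt{f(x)}-\sqrt{g(x)}\Bigr)^{2}\,dx,
\qquad
L_{\mathrm{LINEX}}(\Delta) = b\bigl(e^{a\Delta}-a\Delta-1\bigr),\quad a\neq 0,\ b>0.

The LINEX loss penalizes positive and negative errors Δ asymmetrically, which is what suits it to the asymmetric filtering and forecasting problem the abstract mentions; for small |aΔ| it reduces to a scaled quadratic loss, b a²Δ²/2.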


Using Kullback-Leibler distance for performance evaluation of search designs

This paper considers the search problem introduced by Srivastava [Sr]. This is a model discrimination problem. In the context of search linear models, the discrimination ability of search designs has been studied by several researchers. Some criteria have been developed to measure this capability; however, they are restricted in the sense of being able to work for searching only one possibl...


Model Confidence Set Based on Kullback-Leibler Divergence Distance

Consider the problem of estimating the true density h(.) based upon a random sample X1, …, Xn. In general, h(.) is approximated using a model fθ(x) that is appropriate in some sense (see below). This article, using Vuong's (1989) test along with a collection of k(> 2) non-nested models, constructs a set of appropriate models, a so-called model confidence set, for the unknown model h(.). Application of such confide...
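The pairwise building block here, Vuong's (1989) test, has a standard form; a minimal sketch follows. How the article aggregates the pairwise tests over the k models into the confidence set is not recoverable from this truncated abstract, so only the per-pair statistic is shown, and the function name is illustrative.

import numpy as np

def vuong_statistic(loglik_f, loglik_g):
    # Per-observation log-likelihoods of two fitted non-nested models.
    # Under the null that both models are equally close (in KL
    # divergence) to the true density h, the statistic is
    # asymptotically standard normal.
    m = np.asarray(loglik_f) - np.asarray(loglik_g)
    return np.sqrt(m.size) * m.mean() / m.std(ddof=1)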


Local Parametric Modeling via U-Divergence

This paper discusses local parametric modeling using U-divergence in statistical pattern recognition. The class of U-divergence measures commonly has an empirical loss function of a simple form, and includes the Kullback-Leibler divergence, the power divergence and mean squared error. We propose a minimization algorithm for parametric models of sequentially increasing dimension by incorpora...


Boosting Learning Algorithm for Pattern Recognition and Beyond

This paper discusses recent developments in pattern recognition, focusing on the boosting approach in machine learning. Statistical properties such as Bayes risk consistency for several loss functions are discussed in a probabilistic framework. A number of loss functions have been proposed for different purposes and targets. A unified derivation is given by a generator function U which naturall...
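For orientation, the most familiar instance of the boosting loss functions this abstract refers to is AdaBoost's exponential loss (a standard fact, not a summary of the paper's generator-function derivation):

L_{\exp}(F) = \frac{1}{n}\sum_{i=1}^{n} \exp\bigl(-y_i F(x_i)\bigr),
\qquad y_i\in\{-1,+1\}.

Minimizing this empirical loss over additive combinations of weak learners yields the AdaBoost update, and Bayes risk consistency results of the kind mentioned above ask when minimizers of such surrogate losses converge to the Bayes-optimal classifier.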



Publication date: 2003